Patent Abstract:
Determining a number of objects in an IR image. The present invention relates to a method and system for determining the number of objects in an IR image obtained by an IR imaging system, the method comprising: collecting a total of N intensity values for each pixel in an IR image, said intensity values having been collected using an IR imaging system comprising an IR detection device and an IR illuminator; retrieving estimated intensity values for a plurality of known materials from a database; determining a classification for each pixel in said IR image using one of a best-fit reflectance method and a correlation method; and determining a total number of objects in said image based on said pixel classifications.
Publication number: BR102012008481A2
Application number: R102012008481
Filing date: 2012-04-11
Publication date: 2018-09-18
Inventors: Lalit K. Mestha; Yao Rong Wang; Zhigang Fan
Applicant: Xerox Corp.
IPC main class:
Patent Description:

(54) Title: DETERMINATION OF A NUMBER OF OBJECTS IN AN IR IMAGE
(51) Int. Cl.: G06K 9/00; G06K 9/60
(30) Convention Priority: 4/13/2011 US 13/086,006
(73) Holder(s): XEROX CORPORATION
(72) Inventor(s): YAO RONG WANG; ZHIGANG FAN; LALIT K. MESTHA
(85) National Phase Start Date: 11/04/2012
(57) Abstract: DETERMINATION OF A NUMBER OF OBJECTS IN AN IR IMAGE. The present invention relates to a method and system for determining the number of objects in an IR image obtained by an IR imaging system, the method comprising: collecting a total of N intensity values for each pixel in an IR image, said intensity values having been collected using an IR imaging system comprising an IR detection device and an IR illuminator; retrieving estimated intensity values for a plurality of known materials from a database; determining a classification for each pixel in said IR image using one of a best-fit reflectance method and a correlation method; and determining a total number of objects in said IR image based on said pixel classifications.

Invention Patent Descriptive Report for DETERMINING A NUMBER OF OBJECTS IN AN IR IMAGE.
Technical Field
The present invention relates to systems and methods that use an infrared camera system to obtain a multiband IR image and then determine the total number of objects in that IR image.
Background
There are many face recognition systems (FRS) commercially available. However, many of these systems require face placement in front of a camera and good lighting. These conditions are often not available where face recognition may be necessary or desired. In many practical situations, the subject sought for detection and recognition may be in motion while passing in front of an infrared camera, for example while moving in a car. Moreover, the subject's lighting may be inadequate. The art needs methods for determining a number of objects in an image captured using an infrared camera.
Brief Summary
What is described is a new system and method for determining the number of objects in an IR image. The present system and method provide a means for separating objects from the surrounding background. This system finds its intended use in a wide array of real-world applications, such as, for example, determining the number of occupants in a vehicle traveling in an HOV/HOT lane, or in an image captured by a security camera.
In an example embodiment, the present system and method involve the following. First, using an IR imaging system, a total of N intensity values are collected for each pixel in an IR image. Once the intensity values have been collected, each pixel in the IR image is processed so that a classification can be determined for that pixel. In one embodiment, intensity values are calculated using reflectances that have been estimated for a plurality of known materials, such as, for example, hair and skin. Pixel intensities are then compared with the calculated intensity values and the pixels are classified based on a best-fit reflectance. In another embodiment, a correlation coefficient is calculated between each pixel's intensity values and the retrieved intensity values. The pixels are then classified based on the amount of correlation between them. Once the pixels have been classified, the total number of objects in the IR image can be determined based on the pixel classifications.
Description of the Drawings
Figure 1 shows an example vehicle occupancy detection system according to the present teachings.
Figure 2 illustrates an example embodiment of the present method directed to determining the number of occupants in a motor vehicle.
Figures 3-9 show the relative radiant intensity of IR LEDs suitable for use with various embodiments.
Figure 10 shows correlation coefficients for a 3-band and a 4-band system with η = 0.
Figure 11 shows correlation coefficients for a 3-band and a 4-band system with η = 0.05.
Figure 12 is a table of correlation coefficients for a 3-band and a 4-band system with η = 0 and 20% white noise.
Figure 13 illustrates a block diagram of an example system capable of implementing various aspects of the present method, as shown and described in relation to the flowchart of figure 2.
Detailed Description
What is described is a new system and method for determining the number of objects in an IR image obtained using an IR imaging system.
A pixel is the smallest addressable element in an image. Each pixel has its own address. Pixels are usually arranged in a grid. The intensity of each pixel is variable and depends on the characteristics and sensitivity of the sensing device being used to measure that pixel. The resolution of a camera is, effectively, the pixel size. Smaller pixels mean that more of them fit in the image, giving better definition.
An IR image is an image obtained from an IR detection device having detected IR light reflected from an illuminated sample. A fully populated IR image consists of pixels, each having an intensity value in each desired spectral band of interest. Infrared (IR) light is electromagnetic radiation with a wavelength between 0.7 and 300 micrometers. It should be noted that the upper-end wavelength of the IR range is not precisely established. This equals a frequency range between 1 and 430 THz. IR wavelengths are longer than those of visible light but shorter than microwaves. The brightest sunlight provides an irradiance of approximately 1 kilowatt per square meter at sea level. Of this energy, 527 watts are infrared, 445 watts are visible light and 32 watts are ultraviolet. In active infrared, the camera illuminates the scene at infrared wavelengths invisible to the human eye. Infrared energy is only part of the electromagnetic spectrum, which encompasses gamma rays, X-rays, ultraviolet radiation, a thin region of visible light, infrared, terahertz waves, microwaves and radio waves. These are all related and differentiated by their wavelength. Several embodiments use the entire lower reflective infrared (LRIR) band (that is, approximately 800-1400 nanometers). LRIR can be detected with a multiband imaging (MBI) device sensitive to that frequency band and provides images that resemble a black-and-white picture. The upper reflective infrared (URIR) band spans approximately 1400-2200 nanometers. A detected URIR image is not similar to LRIR images because human flesh does not reflect IR in the same way as inanimate objects. Since the lower and upper IR bands are reflective, the scene may need a light source. This lighting need not be visible, so it will not be a distraction to humans. In daytime, LRIR and URIR illumination may be unnecessary because sufficient IR lighting can be provided by ordinary sunlight.
An IR illuminator is a light source. Light levels can be controlled by varying the drive currents. For example, the optical output of LEDs varies linearly with the current. Arrays of LEDs capable of IR illumination, sequentially over time or simultaneously, are well known.
An IR imaging system is a device designed to capture IR light reflected from a target object, separate it into its component wavelengths and output an IR image of the target. Such systems have an IR detector (such as an IR camera) and an IR illuminator. An example IR detection system is shown in figure 3. An IR imaging system can be a single IR detection device with a sequentially illuminated N-band illuminator (N ≥ 3) and a fixed filter, or can comprise a total of N detection devices (N ≥ 3), each having a respective bandpass filter, and a single light source.
Reference is now being made to figure 1, which shows an example vehicle occupancy detection system.
In figure 1, target vehicle 100 contains a human occupant traveling at speed in a direction of movement indicated by directional vector 103 along HOV lane 104. Positioned at a desired distance d above lane 104 is support arm 105, comprising a tubular construction similar to that used for traffic lights. Attached to arm 105 are IR detection system 107, having a transmission element 108 for communication with a remote device, and IR illumination system 109. Detection device 107 can comprise a camera equipped with a telephoto lens, a bandpass filter, and a polarizing lens to reduce glare. During daytime operation, illumination by the sun may be sufficient. IR illuminator 109 emits IR radiation at one or more wavelengths that are reflected back to detector 107 from the target vehicle and its contents. IR detection system 107 transmits the IR image and/or the intensity values associated with each pixel in the IR image to a computing device for further processing in a manner that will be described below.
Reference is now being made to the flowchart of figure 2, which illustrates an example embodiment of the present method for determining the number of objects in a motor vehicle. It will be appreciated that, although this embodiment is discussed in the context of a transportation management system, the teachings herein are intended to find use in a wide array of systems in which determining a number of objects in an IR image obtained using an IR imaging system is desired. Such embodiments are intended to be within the scope of the appended claims. Flow processing starts at 200 and immediately proceeds to step 202.
In step 202, a total of N intensity values are collected for each pixel in an IR image. The intensity values for each pixel are collected from the reflected IR light using the imaging system of figure 1. The IR image and/or the intensity values for each pixel can be provided to a computer workstation or a special-purpose computer system for further processing according to the various embodiments. In the following, it is assumed that the IR attenuation in the air and the integration time are the same for all bands. If not, these factors are adjusted accordingly.
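For illustration only (not part of the patent disclosure), the per-pixel collection of step 202 can be pictured as stacking the N single-band images into one array, so that each pixel carries its N intensity values together; the array shapes and values below are hypothetical:

```python
import numpy as np

def stack_bands(band_images):
    """Stack N single-band IR images (each H x W) into an H x W x N cube,
    so cube[y, x, :] holds the N intensity values collected for pixel (y, x)."""
    return np.stack(band_images, axis=-1).astype(float)

# Four synthetic 2 x 2 band images standing in for a 4-band capture:
bands = [np.full((2, 2), v) for v in (10.0, 20.0, 30.0, 40.0)]
cube = stack_bands(bands)
print(cube.shape)   # (2, 2, 4)
print(cube[0, 0])   # [10. 20. 30. 40.]
```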
If the IR detection device is a single IR detection device and the IR illuminator is a sequentially illuminated N-band illuminator (N ≥ 3) with a fixed filter, the intensity value comprises:
I_c(i) = α ∫ I_s^i(λ) [T_G²(λ) R_o(λ) + η R_G(λ)] T_L(λ) D(λ) dλ + I_b,    (1)

where i = 1 ... N, such that i is the i-th IR band of the sequentially illuminating illuminator, α is a constant that depends on the angle and distance of the light source, the attenuation of the IR wave in the air, and the integration time of the detection device, I_s^i(λ) is the intensity of the i-th band of the light source, I_b is the intensity of a background light source, such as the IR component of sunlight, R_o(λ) is the reflectance of an object inside the vehicle, R_G(λ) and T_G(λ) are the reflectance and transmittance of the glass, the constant η is a measure of the percentage of illuminator light reflected from the vehicle glass and received by the detector, T_L(λ) is the transmittance of the fixed filter and D(λ) is the responsivity of the detection device.
If the IR detection device comprises N detection devices having N bandpass filters (N ≥ 3) and the IR illuminator is a single illuminator covering the wavelength range of the filters, the intensity value comprises:
I_c(i) = α ∫ I_s(λ) [T_G²(λ) R_o(λ) + η R_G(λ)] T_L^i(λ) D(λ) dλ + I_b,    (2)

where i = 1 ... N, such that i refers to the i-th IR bandpass filter, α is a constant that depends on the angle and distance of the light source, the attenuation of the IR wave in the air, and the integration time of the detection device, I_s(λ) is the intensity of the light source, I_b is a background intensity, R_o(λ) is the reflectance of an object inside the vehicle, R_G(λ) and T_G(λ) are the reflectance and transmittance of the glass, the constant η is a measure of the percentage of illuminator light reflected from the vehicle glass and received by the detector, T_L^i(λ) is the transmittance of the i-th filter and D(λ) is the responsivity of the detection device. Any of the pixel intensity values can be combined to generate one or more new intensity values for that pixel and processed accordingly.
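As a sketch only, the band integrals of Eqs. (1) and (2) can be evaluated numerically once the spectra are tabulated on a common wavelength grid; every spectrum below is an invented placeholder, not data from the patent:

```python
import numpy as np

def band_intensity(wl, I_s, T_g, R_o, R_g, T_l, D, alpha=1.0, eta=0.05, I_b=0.0):
    """Trapezoidal evaluation of the camera intensity model
    I_c = alpha * Int I_s(l) [T_g(l)^2 R_o(l) + eta R_g(l)] T_l(l) D(l) dl + I_b,
    with all spectra tabulated on the same wavelength grid wl (nanometers)."""
    f = I_s * (T_g ** 2 * R_o + eta * R_g) * T_l * D
    return alpha * float(np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(wl))) + I_b

# Invented flat spectra over the camera's 900-1700 nm range:
wl  = np.linspace(900.0, 1700.0, 81)
I_s = np.ones_like(wl)                                # illuminator spectrum
T_g = np.full_like(wl, 0.8)                           # glass transmittance
R_o = np.full_like(wl, 0.5)                           # object reflectance (e.g. skin)
R_g = np.full_like(wl, 0.1)                           # glass reflectance
T_l = np.where((wl > 1000) & (wl < 1100), 1.0, 0.0)   # i-th bandpass filter
D   = np.ones_like(wl)                                # detector responsivity
I_c = band_intensity(wl, I_s, T_g, R_o, R_g, T_l, D)
```

Each band i supplies its own filter curve T_l (or its own illuminator band I_s^i in the sequential configuration of Eq. (1)), yielding the N values I_c(1) ... I_c(N) per pixel.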
In step 204, a first pixel of the IR image is retrieved for processing. The first pixel can be selected automatically by a processor or be identified by a user using a graphical user interface, such as, for example, a keyboard, mouse and monitor associated with a computer workstation, where various aspects of the present method are intended to be carried out. The user can select a first region of the received IR image, and some or all of the pixels within the selected area can be processed ahead of other portions of the IR image. One or more IR images of the moving vehicle can be captured and processed accordingly.
In step 206, the identified pixel is classified using either a best-fit reflectance method or a correlation method. In the correlation method, estimated intensity values for a plurality of known materials are retrieved from a storage device or a remote device via a network connection. A correlation coefficient is calculated between the pixel's intensity values and the retrieved intensity values. The pixel is then classified based on the computed correlation between the pixel intensity values and the intensity values for these known materials. In one embodiment, the correlation coefficient is given by:
C = [ Σ_{i=1..N} ΔI_cm(i) ΔI_cs(i) ] / [ √(Σ_{i=1..N} ΔI_cm(i)²) √(Σ_{i=1..N} ΔI_cs(i)²) ],    (3)

where ΔI_cm(i) = I_cm(i) − Ī_m is a measured intensity difference and ΔI_cs(i) = I_cs(i) − Ī_s is a calculated intensity difference, Ī_m and Ī_s being the corresponding mean intensities, such that if the intensity calculated with a particular reflectance agrees with the measured intensity of the object, the correlation will be high (close to 1); otherwise, the correlation will be small (close to 0 or negative).
In the best-fit reflectance method, reflectance values that have been estimated for a plurality of known materials are retrieved from a storage device or a remote device over a network. Intensity values are calculated for those known materials and the pixel is classified based on a best-fit reflectance.
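The two classification rules of step 206 can be sketched as follows; the material names, reference intensities and the 0.9 threshold are illustrative assumptions, not values from the patent:

```python
import numpy as np

def correlation(meas, calc):
    """Eq. (3): correlation between the measured and calculated N-band
    intensities, each taken as deviations from its own mean."""
    dm = meas - meas.mean()   # delta I_cm(i)
    ds = calc - calc.mean()   # delta I_cs(i)
    return float(dm @ ds) / (np.linalg.norm(dm) * np.linalg.norm(ds))

def classify_by_correlation(meas, ref_intensities, threshold=0.9):
    """Label the pixel with the material of highest correlation above the
    threshold, or 'other' when no material correlates well enough."""
    best, best_c = "other", threshold
    for name, calc in ref_intensities.items():
        c = correlation(meas, np.asarray(calc, dtype=float))
        if c > best_c:
            best, best_c = name, c
    return best

def classify_by_best_fit(meas, ref_intensities):
    """Label the pixel with the material whose calculated intensities leave
    the smallest sum-of-squares residual (best-fit reflectance)."""
    return min(ref_intensities, key=lambda n:
               float(np.sum((meas - np.asarray(ref_intensities[n], dtype=float)) ** 2)))

refs = {"skin":  [2.0, 4.0, 6.0, 8.0],   # invented 4-band reference intensities
        "glass": [4.0, 3.0, 2.0, 1.0]}
pixel = np.array([2.1, 3.9, 6.2, 7.8])
print(classify_by_correlation(pixel, refs))  # skin
print(classify_by_best_fit(pixel, refs))     # skin
```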
In step 208, once the current pixel has been classified, a determination is made as to whether any pixels remain to be processed. If so, processing repeats from step 204, in which a next pixel is retrieved, selected or otherwise identified for processing. Processing repeats until all desired pixels in the IR image have been processed.
In step 218, a total number of objects in the motor vehicle is then determined based on the pixel classifications. Further processing of the IR image, in this embodiment, then ends.
Since the pixels in the image can be separated from surrounding non-human objects, neural networks or fuzzy logic can be employed to facilitate a determination of the number of objects (living or non-living) in the vehicle. In one embodiment, this is achieved by spatially isolating the identified human beings in each of one or more IR images taken of the target vehicle by the imaging system and counting the number of objects. Three IR cameras may be employed, such as, for example, one facing the front of the moving vehicle to capture an image of the front passenger compartment and one facing each side of the vehicle to capture images of the passenger and driver sides; each image can then be analyzed to determine the objects present. In an example implementation, if the number of human occupants in the motor vehicle does not meet a predetermined number during the time of day when travel in an HOV lane is restricted, the vehicle's license plate can be captured automatically using vehicle tag identification technology and a signal sent to a traffic enforcement authority, indicating that a vehicle with the identified license plate number is using the HOV lane without the required number of occupants.
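The counting idea can be sketched with a plain flood fill: after classification, adjacent pixels labeled as human skin form blobs, and each spatially separate blob is counted as one occupant. The mask below is a made-up example, not patent data:

```python
def count_objects(mask):
    """Count 4-connected regions of truthy pixels in a 2-D mask; each
    connected region is taken as one object (e.g. one occupant's face)."""
    h, w = len(mask), len(mask[0])
    seen = [[False] * w for _ in range(h)]
    count = 0
    for y in range(h):
        for x in range(w):
            if mask[y][x] and not seen[y][x]:
                count += 1                    # new region found
                stack = [(y, x)]
                while stack:                  # flood-fill the whole region
                    cy, cx = stack.pop()
                    if 0 <= cy < h and 0 <= cx < w and mask[cy][cx] and not seen[cy][cx]:
                        seen[cy][cx] = True
                        stack += [(cy + 1, cx), (cy - 1, cx), (cy, cx + 1), (cy, cx - 1)]
    return count

# Two spatially separated skin-pixel blobs -> two occupants:
mask = [[1, 1, 0, 0],
        [1, 0, 0, 1],
        [0, 0, 1, 1]]
print(count_objects(mask))  # 2
```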
It should be appreciated that the flowcharts are illustrative. One or more of the steps illustrated in any of the flowcharts can be performed in a different order. Other operations can, for example, be added, modified, improved, condensed, integrated or consolidated with these steps. Such variations are intended to be within the scope of the appended claims. All portions of the flowcharts can be implemented partially or completely in hardware in conjunction with machine-executable instructions.
A 4-band illumination system was used. The LEDs used had peak wavelengths of 940 nm, 1070 nm, 1200 nm and 1550 nm. The energy spectra of the LEDs are shown in figures 3-9. Near-infrared (NIR) LEDs are now available on the market. These LEDs are made with various semiconductors, such as GaAs or InGaAsP, and have peak wavelengths ranging from near visible (> 730 nm) up to infrared wavelengths (> 2000 nm). Figures 3-9 illustrate the spectra of several InGaAsP infrared LEDs with peak wavelengths ranging from 810 nm to 1600 nm. These LEDs have good radiant energy, between a few mW and about 45 mW at the high end. Several LEDs with the same peak wavelength can be bundled together in a batch or group. This study assumes that each LED illuminator is energized equally (if not, the ratios in the following tables can be adjusted). The reflectances of various materials, including human skin, and the transmittance of the window are available in a wide range of published literature. The IR detection camera is commercially available and captures images from 900 nm to 1700 nm. Results using the correlation coefficients of Eq. (3) are shown in the tables of figures 10 and 11. The term 'C123' means the correlation coefficient for the sequential illuminator with only band 1 (peak wavelength 940 nm), band 2 (1070 nm) and band 3 (1200 nm); notation is similar for the other 3-band correlation coefficients. The term 'C4' stands for the correlation coefficient using all 4 bands. The table in figure 10 shows the correlation coefficients with η = 0. From the table in figure 10, it can be seen that most 3-band combinations and the 4-band system work (negative or small positive correlation coefficients between skin and other materials), except for the 3-band combination of bands 1, 2 and 3. The table in figure 11 shows the correlation coefficients with η = 0.05. As can be seen, the quality of separation of skin from the other materials is reduced relative to the case with η = 0.
However, except for dusty glass and dark skin, the C234 combination and the 4-band system are still reasonably good at separating skin from other materials. To further test robustness, 20% white noise was added to the measured intensities and the above correlations were tested again. The results are shown in the table of figure 12. It is clear that, for the C234 3-band and the 4-band camera systems, the classification method is robust at this noise level.
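The 20% white-noise check can be reproduced in outline by perturbing each measured band intensity with zero-mean Gaussian noise and recomputing Eq. (3); the band intensities below are invented, not the patent's measured data:

```python
import numpy as np

rng = np.random.default_rng(0)

def noisy_correlation(meas, calc, noise_frac=0.2):
    """Eq. (3) correlation after adding zero-mean white noise whose standard
    deviation is noise_frac times each measured band intensity."""
    noisy = meas + rng.normal(0.0, noise_frac * np.abs(meas))
    dm = noisy - noisy.mean()
    ds = calc - calc.mean()
    return float(dm @ ds) / (np.linalg.norm(dm) * np.linalg.norm(ds))

# Invented 4-band intensities for a well-matched material pair:
meas = np.array([1.0, 2.5, 3.0, 1.5])
calc = np.array([1.1, 2.4, 3.1, 1.4])
trials = [noisy_correlation(meas, calc) for _ in range(200)]
print(round(float(np.mean(trials)), 2))  # the mean correlation stays high despite the noise
```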
Reference is now made to figure 13, which illustrates a block diagram of an example processing system capable of implementing various aspects of the present method shown and described in relation to the flowchart of figure 2.
The embodiment of figure 13 is shown comprising a workstation 1304 in communication with IR image receiver 1302 for receiving pixel intensity values from antenna 108 of IR detection device 107 of figure 1 and for effecting bidirectional communication between computer 1304 and detection device 107. Computer 1304 has a monitor 1303 and user interface 1305 to allow information to be viewed by a user and to receive user input or selections. Computer 1304 is also in communication with network 1301 via a network communication interface (shown). Various portions of the captured IR image and/or pixel intensity values can be stored in a memory or storage device internal to workstation 1304 and can be communicated to a remote device via network 1301 for storage or further processing. A user can use the graphical user interface, for example the keyboard and monitor, to identify or otherwise select pixels and/or areas of the IR image for processing, or to provide other user input required for the implementation. Pixels and/or regions
of interest identified or otherwise detected in the received IR image data can be retrieved from a remote device, such as an image processing system, over network 1301. Desktop computer 1304 and receiver 1302 are in communication with Image Processor 1306.
Image Processor 1306 is shown comprising a Buffer 1307 for queuing information about the received IR image, such as, for example, regions of interest within the image and the like, that have been selected or otherwise identified for pixel processing. Buffer 1307 can also store retrieved data and mathematical formulas and representations to process pixels and groups of pixels in the manner described above. Intensity Calculator 1308 receives from workstation 1304 data and information about the variables necessary to perform the required calculations. Pixel Identifier Module 1309 identifies the current pixel to be processed, as described above with respect to step 204. Module 1309 is in communication with monitor 1303 to present a display for the user to select which pixel in the displayed IR image is to be processed next. The user can select some or all of the displayed IR image for processing. In other embodiments, the image is processed automatically, and it is to be understood that such embodiments are intended to be within the scope of the appended claims. Pixel Classification Module 1310 is a processor with memory in communication with Intensity Calculator 1308 to obtain the pixel intensity values for the current pixel and generate, depending on the method implemented, a correlation coefficient and threshold values, or reflectance values and a best-fit reflectance, and save the results to storage device 1311. Based on the method employed, the pixel is classified. Module 1308 also stores/retrieves values to/from storage device 1311 for retrieval by module 1310. Object Identification Module 1312 receives the classified pixel from module 1310 and identifies the pixel based on the classification. The classification
of that pixel is saved to storage device 1311.
It should be understood that any of the modules and processing units of figure 13 are in communication with storage device 1311 via pathways shown or not shown and can store/retrieve data, parameter values, functions, records, and machine-readable/executable program instructions required to perform their intended functions. Each of these modules is also in communication with workstation 1304 via pathways (not shown) and may further be in communication with one or more remote devices via network 1301. It should be appreciated that some or all of the functionality of any of the modules can be performed, in whole or in part, by components internal to workstation 1304 or by a special-purpose computer system. It should also be appreciated that various modules can designate one or more components that can, in turn, comprise software and/or hardware designed to perform the intended function. A plurality of modules can collectively perform a single function. Each module can have a specialized processor capable of executing machine-readable program instructions. A module can comprise a single piece of hardware, such as an ASIC, electronic circuit or special-purpose processor. A plurality of modules can be executed by a single special-purpose computer system or by a plurality of special-purpose computer systems operating in parallel. Connections between modules can include physical and logical connections. Modules can also include one or more software/hardware modules that can further comprise an operating system, drivers, device controllers and other apparatus, some or all of which can be connected via a network. It is also contemplated that one or more aspects of the present method can be implemented on a dedicated computer system and can also be practiced in distributed computing environments, where tasks are performed by remote devices connected via a network.
One or more aspects of the methods described herein are intended to be incorporated into an article of manufacture, including one or more computer program products, having computer-usable or machine-readable media. The article of manufacture can be included on at least one storage device readable by a machine architecture or another xerographic or image processing system, embodying executable program instructions capable of carrying out the methodology described in the flowcharts. Additionally, the article of manufacture can be included as part of a xerographic system or an operating system, or as a plug-in, or can be shipped, sold, leased or otherwise provided separately, alone or as part of an accessory, promotion, upgrade or product set.
It will be appreciated that various of the above-described aspects and functions, and others, or alternatives thereof, may be desirably combined into many other different systems or applications. Various alternatives, modifications, variations or improvements not presently foreseen may become apparent and/or be subsequently made by those skilled in the art, and these too are intended to be encompassed by the following claims. Accordingly, the embodiments presented above are to be considered illustrative and not limiting. Various changes to the embodiments described above can be made without departing from the spirit and scope of the invention. The teachings of any printed publications, including patents and patent applications, are each separately incorporated herein by reference in their entirety.
Claims:
Claims (13)
[1]
1. Method for determining the number of objects in an IR image obtained by an IR imaging system, the method comprising:
- collecting a total of N intensity values for each pixel in an IR image, said intensity values having been collected using an IR imaging system comprising an IR detection device and an IR illuminator;
- retrieving estimated intensity values for a plurality of known materials from a database;
- determining a classification for each pixel in said IR image, using one of a best-fit reflectance method and a correlation method; and
- determining a total number of objects in said IR image based on said pixel classifications.
[2]
2. A method according to claim 1, wherein said IR detection device is a single IR detection device and wherein said IR illuminator is an N-band illuminator that illuminates sequentially (N ≥ 3) with a fixed filter, said intensity value comprising:

I_c(i) = α ∫ I_s^i(λ) [T_G²(λ) R_o(λ) + η R_G(λ)] T_L(λ) D(λ) dλ + I_b,

where i = 1 ... N, such that i is the i-th IR band of the sequentially illuminating illuminator, α is a constant that depends on the angle and distance of the light source, the attenuation of the IR wave in the air, and the integration time of the detection device, I_s^i(λ) is the intensity of the i-th band of the light source, I_b is the intensity of a background light source, such as the IR component of sunlight, R_o(λ) is the reflectance of an object detected by said detection device, R_G(λ) and T_G(λ) are the reflectance and transmittance of glass in the detection path (otherwise R_G(λ) = 0 and T_G(λ) = 1), the constant η is a measure of the percentage of illuminator light reflected from the vehicle glass and received by the detector (otherwise η is zero), T_L(λ) is the transmittance of the fixed filter and D(λ) is the responsivity of the detection device.
[3]
3. Method according to claim 2, wherein said correlation method comprises:

C = [ Σ_{i=1..N} ΔI_cm(i) ΔI_cs(i) ] / [ √(Σ_{i=1..N} ΔI_cm(i)²) √(Σ_{i=1..N} ΔI_cs(i)²) ],

where ΔI_cm(i) = I_cm(i) − Ī_m is a measured intensity difference and ΔI_cs(i) = I_cs(i) − Ī_s is a calculated intensity difference, Ī_m and Ī_s being the corresponding mean intensities, such that if the intensity calculated with a particular reflectance agrees with the measured intensity of the object, the correlation will be high (close to 1), otherwise the correlation will be small (close to 0 or negative); and
- classifying said pixel based on an amount of said correlation.
[4]
4. Method according to claim 2, wherein said best-fit reflectance method comprises:
- cross-referencing an intensity value associated with said pixel with at least one intensity value calculated using a known reflectance retrieved from a database; and
- classifying said pixel based on a best-fit reflectance.
[5]
5. A method according to claim 1, wherein said IR detection device comprises N detection devices having N bandpass filters (N ≥ 3), and said IR illuminator is an illuminator covering the wavelength range of said filters, said intensity value comprising:

I_c(i) = α ∫ I_s(λ) [T_G²(λ) R_o(λ) + η R_G(λ)] T_L^i(λ) D(λ) dλ + I_b,

where i = 1 ... N, such that i refers to the i-th IR bandpass filter, α is a constant that depends on the angle and distance of the light source, the attenuation of the IR wave in the air, and the integration time of the detection device, I_s(λ) is the intensity of the light source, I_b is the intensity of a background light source, R_o(λ) is the reflectance of an object detected by said detection device, R_G(λ) and T_G(λ) are the reflectance and transmittance of glass in the detection path (otherwise R_G(λ) = 0 and T_G(λ) = 1), the constant η is a measure of the percentage of illuminator light reflected from the vehicle glass and received by the detector (otherwise η is zero), T_L^i(λ) is the transmittance of the i-th filter and D(λ) is the responsivity of said detection device.
[6]
6. A method according to claim 5, wherein said correlation method comprises:

C = [ Σ_{i=1..N} ΔI_cm(i) ΔI_cs(i) ] / [ √(Σ_{i=1..N} ΔI_cm(i)²) √(Σ_{i=1..N} ΔI_cs(i)²) ],

where ΔI_cm(i) = I_cm(i) − Ī_m is a measured intensity difference and ΔI_cs(i) = I_cs(i) − Ī_s is a calculated intensity difference, Ī_m and Ī_s being the corresponding mean intensities, such that if the intensity calculated with a particular reflectance agrees with the measured intensity of the object, the correlation will be high (close to 1), otherwise the correlation will be small (close to 0 or negative); and
- classifying said pixel based on an amount of said correlation.
[7]
7. The method of claim 5, wherein said best-fit reflectance method comprises:
- cross-referencing an intensity value associated with said pixel with at least one intensity value calculated using a known reflectance retrieved from a database; and
- classifying said pixel based on a best-fit reflectance.
[8]
8. System for determining the number of objects in an IR image, the system comprising:
- an imaging system comprising an IR detection device and an IR illuminator;
- a memory and a storage medium; and
- a processor in communication with said storage medium and said memory, said processor executing machine-readable program instructions to perform the method of:
- collecting a total of N intensity values for each pixel in an IR image, using said imaging system;
- retrieving estimated intensity values for a plurality of known materials from a database;
- determining a classification for each pixel in said IR image, using one of a best-fit reflectance method and a correlation method; and
- determining a total number of objects in said IR image based on said pixel classifications.
i
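The final step of claim 8 derives an object count from the per-pixel classifications. One plausible realization (our assumption, not the claimed procedure itself) is to group like-classified pixels into connected regions and count the regions:

```python
def count_objects(mask):
    """Count connected regions of truthy pixels (4-connectivity) in a binary
    per-pixel classification mask, using an iterative flood fill.

    `mask` is a list of equal-length rows; truthy entries mark pixels
    classified as the material of interest (e.g. human skin).
    """
    rows = len(mask)
    cols = len(mask[0]) if rows else 0
    seen = [[False] * cols for _ in range(rows)]
    count = 0
    for r in range(rows):
        for c in range(cols):
            if mask[r][c] and not seen[r][c]:
                count += 1           # new region found
                seen[r][c] = True
                stack = [(r, c)]
                while stack:         # flood-fill the whole region
                    y, x = stack.pop()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < rows and 0 <= nx < cols \
                                and mask[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            stack.append((ny, nx))
    return count
```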
[9]
9. The system according to claim 8, wherein said IR detection device is a single IR detection device and wherein said IR illuminator is an N-band illuminator (N ≥ 3) illuminating sequentially, with a fixed filter, said intensity values comprising:
I_c(i) = α ∫_{λ1}^{λ2} I_s^i(λ) [T_G(λ)² R_o(λ) + η R_G(λ)] T_L(λ) D(λ) dλ + I_b,

where i = 1 ... N, such that i is the i-th IR band-pass band of the illuminator that is illuminating sequentially; α is a constant that depends on an angle and distance from the illumination source, an attenuation of the IR wave in the air, and an integration time of said detection device; I_s^i(λ) is the intensity of the i-th band of the illumination source; I_b is an intensity of a background light source, such as that from the IR component of sunlight; R_o(λ) is a reflectance of an object detected by said IR detection device; R_G(λ) and T_G(λ) are a reflectance and a transmittance of the glass, otherwise R_G(λ) = 0 and T_G(λ) = 1; the constant η is a measure of the percentage of illuminator light reflected from the vehicle glass and received by the detector, otherwise η is zero; T_L(λ) is a transmittance of the fixed filter; and D(λ) is a responsivity of said detection device.
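The band-intensity integral above can be evaluated numerically when the spectral curves are sampled on a common wavelength grid. A minimal sketch using the trapezoidal rule (the sampling grid, parameter names, and quadrature choice are our assumptions):

```python
def band_intensity(wavelengths, I_s, R_o, R_G, T_G, T_L, D, alpha, eta, I_b):
    """Trapezoidal-rule approximation of the claimed band intensity

        I_c = alpha * ∫ I_s(λ) [T_G(λ)² R_o(λ) + η R_G(λ)] T_L(λ) D(λ) dλ + I_b.

    All spectral quantities are lists sampled at the same `wavelengths` grid;
    T_G appears squared because illuminator light traverses the glass twice.
    """
    # Integrand sampled at each grid point.
    f = [I_s[k] * (T_G[k] ** 2 * R_o[k] + eta * R_G[k]) * T_L[k] * D[k]
         for k in range(len(wavelengths))]
    # Trapezoidal rule over possibly non-uniform wavelength spacing.
    integral = sum((f[k] + f[k + 1]) / 2 * (wavelengths[k + 1] - wavelengths[k])
                   for k in range(len(wavelengths) - 1))
    return alpha * integral + I_b
```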
[10]
10. The system according to claim 9, wherein said correlation method comprises:
- computing a correlation

C = Σ_i [ΔI_cm(i)][ΔI_cs(i)] / √( Σ_i [ΔI_cm(i)]² · Σ_i [ΔI_cs(i)]² ),

where ΔI_cm(i) = I_cm(i) − Ī_m is a measured intensity difference, ΔI_cs(i) = I_cs(i) − Ī_s is a calculated intensity difference, and Ī_m and Ī_s are the means of the measured and calculated intensities, respectively, such that if the intensity calculated with a particular reflectance agrees with the measured intensity of the object, the correlation will be high (close to 1); otherwise, the correlation will be small (close to 0 or negative); and
- classifying said pixel based on a magnitude of said correlation.
[11]
11. The system according to claim 8, wherein said best-fit reflectance method comprises:
- cross-referencing an intensity value associated with said pixel with at least one intensity value calculated using a known reflectance retrieved from a database; and
- classifying said pixel based on a best-fit reflectance.
[12]
12. The system according to claim 8, wherein said IR detection device comprises N detection devices having N band-pass filters (N ≥ 3), and said IR illuminator comprises one illuminator covering a wavelength range of said filters, said intensity values comprising:
I_c(i) = α ∫_{λ1}^{λ2} I_s(λ) [T_G(λ)² R_o(λ) + η R_G(λ)] T_L^i(λ) D(λ) dλ + I_b,

where i = 1 ... N, such that i is the i-th IR band-pass filter; α is a constant that depends on an angle and distance from the illumination source, an attenuation of the IR wave in the air, and an integration time of said detection device; I_s(λ) is the intensity of the illumination source band; I_b is an intensity of a background light source; R_o(λ) is a reflectance of an object detected by said IR detection device; R_G(λ) and T_G(λ) are a reflectance and a transmittance of the glass, otherwise R_G(λ) = 0 and T_G(λ) = 1; the constant η is a measure of the percentage of illuminator light reflected from the vehicle glass and received by said detector, otherwise η is zero; T_L^i(λ) is a transmittance of the i-th fixed filter; and D(λ) is a responsivity of said detection device.
[13]
13. The system according to claim 12, wherein said correlation method comprises:
- computing a correlation

C = Σ_i [ΔI_cm(i)][ΔI_cs(i)] / √( Σ_i [ΔI_cm(i)]² · Σ_i [ΔI_cs(i)]² ),

where ΔI_cm(i) = I_cm(i) − Ī_m is a measured intensity difference, ΔI_cs(i) = I_cs(i) − Ī_s is a calculated intensity difference, and Ī_m and Ī_s are the means of the measured and calculated intensities, respectively, such that if the intensity calculated with a particular reflectance agrees with the measured intensity of the object, the correlation will be high (close to 1); otherwise, the correlation will be small (close to 0 or negative); and
- classifying said pixel based on a magnitude of said correlation.
14. The system according to claim 12, wherein said best-fit reflectance method comprises:
- cross-referencing an intensity value associated with said pixel with at least one intensity value calculated using a known reflectance retrieved from a database; and
- classifying said pixel based on a best-fit reflectance.
Similar technologies:
Publication number | Publication date | Patent title
BR102012008481A2|2018-09-18|Determining a number of objects in an IR image
BRPI1107116A2|2016-03-22|Determining a total number of people in an IR image obtained through an IR imaging system
KR101940955B1|2019-01-21|Apparatus, systems and methods for improved facial detection and recognition in vehicle inspection security systems
BR102012031758B1|2021-06-01|METHOD AND SYSTEM FOR DETERMINING A THRESHOLD FOR PIXEL CLASSIFICATION
US7602942B2|2009-10-13|Infrared and visible fusion face recognition system
US7469060B2|2008-12-23|Infrared face detection and recognition system
US9907138B2|2018-02-27|Occupancy sensing smart lighting system
Chang et al.2008|Multispectral visible and infrared imaging for face recognition
JP4814655B2|2011-11-16|Road surface state determination device and road surface state determination method
JP2006242909A|2006-09-14|System for discriminating part of object
JP6042555B2|2016-12-14|Object recognition in low and high lux conditions
AU2013319970A1|2015-04-02|Detecting a target in a scene
Bar et al.2010|Target detection and verification via airborne hyperspectral and high-resolution imagery processing and fusion
CN105383369A|2016-03-09|Method and device for adapting a brightness of a headlight for a vehicle
Hao et al.2010|A near-infrared imaging method for capturing the interior of a vehicle through windshield
Daley et al.2013|Detection of vehicle occupants in HOV lanes: exploration of image sensing for detection of vehicle occupants
Bertozzi et al.2007|A night vision module for the detection of distant pedestrians
Daley et al.2011|Sensing system development for HOV/HOT | lane monitoring.
CN110135235A|2019-08-16|A kind of dazzle processing method, device and vehicle
WO2021074122A1|2021-04-22|An image processing method
Yoon et al.2006|Evaluation of vision based in-vehicle applications
Singh et al.2021|Efficient method for real-time range enhancement of electro-optical imaging system
Pavlidis et al.1999|Automatic passenger counting in the HOV lane
Family patents:
Publication number | Publication date
DE102012206078A1|2012-10-18|
MX2012004179A|2012-10-25|
US20120262577A1|2012-10-18|
KR101924647B1|2018-12-03|
CN102842063A|2012-12-26|
JP2012220491A|2012-11-12|
KR20120116882A|2012-10-23|
CN102842063B|2015-04-08|
US8587657B2|2013-11-19|
Cited references:
Publication number | Application date | Publication date | Applicant | Patent title

JP2749191B2|1990-11-06|1998-05-13|新川電機株式会社|How to count the number of people passing by height|
US8078263B2|2000-01-19|2011-12-13|Christie Medical Holdings, Inc.|Projection of subsurface structure onto an object's surface|
DE10034976B4|2000-07-13|2011-07-07|iris-GmbH infrared & intelligent sensors, 12459|Detecting device for detecting persons|
JP3987013B2|2003-09-01|2007-10-03|本田技研工業株式会社|Vehicle periphery monitoring device|
US7460696B2|2004-06-01|2008-12-02|Lumidigm, Inc.|Multispectral imaging biometrics|
US20060020212A1|2004-07-26|2006-01-26|Tianning Xu|Portable vein locating device|
US7469060B2|2004-11-12|2008-12-23|Honeywell International Inc.|Infrared face detection and recognition system|
JP2007213182A|2006-02-08|2007-08-23|Fujifilm Corp|Object status recognition method, device, and program|
US7899217B2|2006-07-19|2011-03-01|Lumidign, Inc.|Multibiometric multispectral imager|
WO2008136644A2|2007-05-07|2008-11-13|Innozest Inc.|Apparatus and method for recognizing subcutaneous vein pattern|
CN101527039B|2008-03-06|2011-12-28|河海大学|Automatic image registration and rapid super-resolution fusion method based on edge feature|
US7869732B2|2008-07-03|2011-01-11|Xerox Corporation|Amplitude modulation of illuminators in sensing applications in printing system|
US8818030B2|2011-12-13|2014-08-26|Xerox Corporation|Post-processing a multi-spectral image for enhanced object identification|
US9351649B2|2012-02-21|2016-05-31|Xerox Corporation|System and method for determining video-based pulse transit time with time-series signals|
US9007438B2|2012-05-21|2015-04-14|Xerox Corporation|3D imaging using structured light for accurate vehicle occupancy detection|
WO2014064898A1|2012-10-26|2014-05-01|日本電気株式会社|Device, method and program for measuring number of passengers|
US9523608B2|2012-12-14|2016-12-20|Xerox Corporation|Material identification from a spectral filtered patterned image without demosaicing|
US20140240511A1|2013-02-25|2014-08-28|Xerox Corporation|Automatically focusing a spectral imaging system onto an object in a scene|
WO2014172581A1|2013-04-17|2014-10-23|Hovtag Llc|Managing vehicular traffic on a roadway|
US9377294B2|2013-06-18|2016-06-28|Xerox Corporation|Handheld cellular apparatus for volume estimation|
TWI532620B|2013-06-24|2016-05-11|Utechzone Co Ltd|Vehicle occupancy number monitor and vehicle occupancy monitoring method and computer readable record media|
US9336594B2|2014-03-07|2016-05-10|Xerox Corporation|Cardiac pulse rate estimation from source video data|
US9320440B2|2014-04-01|2016-04-26|Xerox Corporation|Discriminating between atrial fibrillation and sinus rhythm in physiological signals obtained from video|
US9521335B2|2014-06-17|2016-12-13|Xerox Corporation|Detecting febrile seizure with a thermal video camera|
US10192134B2|2014-06-30|2019-01-29|Microsoft Technology Licensing, Llc|Color identification using infrared imaging|
DE102014223741A1|2014-11-20|2016-05-25|Conti Temic Microelectronic Gmbh|Detecting terahertz radiation to assist a driver of a vehicle|
US9836849B2|2015-01-28|2017-12-05|University Of Florida Research Foundation, Inc.|Method for the autonomous image segmentation of flow systems|
CN104834932A|2015-04-29|2015-08-12|河南城建学院|Matlab algorithm of automobile license plate identification|
WO2017070920A1|2015-10-30|2017-05-04|Microsoft Technology Licensing, Llc|Spoofed face detection|
US10701244B2|2016-09-30|2020-06-30|Microsoft Technology Licensing, Llc|Recolorization of infrared image streams|
US10175030B2|2017-03-13|2019-01-08|Sensors Unlimited, Inc.|Threat detection|
US10949679B2|2017-09-28|2021-03-16|Apple Inc.|Nighttime sensing|
CN109186772B|2018-08-22|2020-03-10|珠海格力电器股份有限公司|Human body judgment method based on infrared detector and electric appliance|
CN109286848B|2018-10-08|2020-08-04|腾讯科技(深圳)有限公司|Terminal video information interaction method and device and storage medium|
Legal status:
2018-09-18| B03A| Publication of a patent application or of a certificate of addition of invention [chapter 3.1 patent gazette]|
2018-09-25| B11A| Dismissal acc. art.33 of ipl - examination not requested within 36 months of filing|
2018-12-11| B11Y| Definitive dismissal - extension of time limit for request of examination expired [chapter 11.1.1 patent gazette]|
Priority:
Application number | Application date | Patent title
US13/086,006|US8587657B2|2011-04-13|2011-04-13|Determining a number of objects in an IR image|